
    Formal Verification of Probabilistic SystemC Models with Statistical Model Checking

    Transaction-level modeling with SystemC has been very successful in describing the behavior of embedded systems by providing high-level executable models, many of which have inherent probabilistic behavior, e.g., random data and unreliable components. It is thus crucial to have both quantitative and qualitative analysis of the probabilities of system properties. Such analysis can be conducted by constructing a formal model of the system under verification and applying Probabilistic Model Checking (PMC). However, this method is infeasible for large systems due to state space explosion. In this article, we demonstrate the successful use of Statistical Model Checking (SMC) to carry out such analysis directly on large SystemC models, allowing designers to express a wide range of useful properties. The first contribution of this work is a framework to verify properties expressed in Bounded Linear Temporal Logic (BLTL) for SystemC models with both timed and probabilistic characteristics. Second, the framework allows users to expose a rich set of user-code primitives as atomic propositions in BLTL; moreover, users can define their own fine-grained time resolution rather than being limited to clock-cycle boundaries in the SystemC simulation. The third contribution is an implementation of a statistical model checker. It comprises automatic monitor generation for producing execution traces of the model under verification (MUV), a mechanism for automatically instrumenting the MUV, and the interaction with statistical model checking algorithms. (Journal of Software: Evolution and Process, Wiley, 2017. arXiv admin note: substantial text overlap with arXiv:1507.0818)
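    The core SMC idea the abstract relies on, estimating the probability of a bounded property from independent simulation runs, can be sketched in a few lines. This is a generic illustration, not the authors' tool: the sample-size formula is the standard Chernoff-Hoeffding bound, and the toy model and property are invented for the example.

```python
import math
import random

def smc_estimate(simulate_trace, holds, epsilon=0.01, delta=0.05):
    # Chernoff-Hoeffding bound: n samples suffice for an estimate within
    # +/- epsilon of the true probability with confidence 1 - delta.
    n = math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))
    successes = sum(holds(simulate_trace()) for _ in range(n))
    return successes / n

# Toy probabilistic model: a component fails in each of 10 steps with
# probability 0.1; the bounded property is "no failure within 10 steps".
def simulate_trace():
    return [random.random() < 0.1 for _ in range(10)]

def no_failure(trace):
    return not any(trace)

p_est = smc_estimate(simulate_trace, no_failure)
# The true probability is 0.9**10, about 0.349.
```

    The point of SMC is visible in the bound: the number of runs depends only on the desired precision, never on the size of the model's state space.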

    Dependability Analysis of Control Systems using SystemC and Statistical Model Checking

    Stochastic Petri nets are commonly used for modeling distributed systems in order to study their performance and dependability. This paper proposes a realization of stochastic Petri nets in SystemC for modeling large embedded control systems. Statistical model checking is then used to analyze the dependability of the constructed model. Our verification framework allows users to express a wide range of useful properties to be verified, which is illustrated through a case study.

    Bounded Expectations: Resource Analysis for Probabilistic Programs

    This paper presents a new static analysis for deriving upper bounds on the expected resource consumption of probabilistic programs. The analysis is fully automatic and derives symbolic bounds that are multivariate polynomials of the inputs. The new technique combines manual state-of-the-art reasoning techniques for probabilistic programs with an effective method for automatic resource-bound analysis of deterministic programs. It can be seen as both an extension of automatic amortized resource analysis (AARA) to probabilistic programs and an automation of manual reasoning for probabilistic programs based on weakest preconditions. As a result, bound inference can be reduced to off-the-shelf LP solving in many cases, and automatically derived bounds can be interactively extended with standard program logics if the automation fails. Building on existing work, the soundness of the analysis is proved with respect to an operational semantics based on Markov decision processes. The effectiveness of the technique is demonstrated with a prototype implementation that is used to automatically analyze 39 challenging probabilistic programs and randomized algorithms. Experimental results indicate that the derived constant factors in the bounds are very precise and even optimal for many programs.
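    What "an upper bound on expected resource consumption" means can be illustrated on a tiny probabilistic program. The program, the tick cost model, and the hand-derived linear bound below are all invented for the illustration (the paper derives such bounds statically and symbolically; here we merely check one by simulation).

```python
import random

def prob_countdown(x, p=0.75):
    # Probabilistic program: while x > 0, decrement x with probability p,
    # otherwise increment it; each loop iteration costs one "tick".
    ticks = 0
    while x > 0:
        ticks += 1
        x = x - 1 if random.random() < p else x + 1
    return ticks

# Hand-derived bound on the expected cost: the walk drifts down by
# p - (1 - p) = 1/2 per iteration, so E[ticks] = 2 * x0 (here: 40).
x0, trials = 20, 20_000
avg_cost = sum(prob_countdown(x0) for _ in range(trials)) / trials
```

    Note that individual runs can be arbitrarily long; only the expectation is bounded, which is exactly the kind of guarantee the analysis in the paper certifies.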

    Dynamic Verification of SystemC with Statistical Model Checking

    Many embedded and real-time systems have an inherent probabilistic behaviour (sensor data, unreliable hardware, etc.). In that context, it is crucial to evaluate system properties such as "the probability that a particular hardware component fails". Such properties can be evaluated using probabilistic model checking. However, this technique fails on models representing realistic embedded and real-time systems because of the state space explosion. To overcome this problem, we propose a verification framework based on Statistical Model Checking. Our framework is able to evaluate probabilistic and temporal properties of large systems modelled in SystemC, a standard system-level modelling language. It is fully implemented as an extension of the Plasma Lab statistical model checker. We illustrate our approach on a multi-lift system case study.
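    Besides estimating a probability, statistical model checkers commonly answer threshold queries ("is the probability at least 0.9?") by sequential hypothesis testing. A minimal sketch of Wald's sequential probability ratio test follows; the lift-system stand-in, the threshold, and all parameters are invented for illustration and do not reflect the Plasma Lab implementation.

```python
import math
import random

def sprt(sample, theta, delta=0.02, alpha=0.001, beta=0.001):
    # Wald's SPRT: decide "p >= theta + delta" (True) versus
    # "p <= theta - delta" (False), with error probabilities alpha, beta.
    p0, p1 = theta - delta, theta + delta
    upper = math.log((1 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1 - alpha))   # accept H0 below this
    llr = 0.0                              # log-likelihood ratio so far
    while lower < llr < upper:
        if sample():
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
    return llr >= upper

# Toy stand-in for a lift-system simulation: a request is served in time
# with probability 0.95; the query is "is P(served in time) >= 0.9?".
decision = sprt(lambda: random.random() < 0.95, theta=0.9)
```

    The appeal of the sequential test is that it draws only as many simulation runs as the evidence requires, typically far fewer than a fixed-sample estimate.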

    Verifying and Synthesizing Constant-Resource Implementations with Types

    We propose a novel type system for verifying that programs correctly implement constant-resource behavior. Our type system extends recent work on automatic amortized resource analysis (AARA), a set of techniques that automatically derive provable upper bounds on the resource consumption of programs. We devise new techniques that build on the potential method to achieve compositionality, precision, and automation. A strict global requirement that a program always maintains constant resource usage is too restrictive for most practical applications. It is sufficient to require that the program's resource behavior remain constant with respect to an attacker who is only allowed to observe part of the program's state and behavior. To account for this, our type system incorporates information flow tracking into its resource analysis. This allows our system to certify programs that need to violate the constant-time requirement in certain cases, as long as doing so does not leak confidential information to attackers. We formalize this guarantee by defining a new notion of resource-aware noninterference, and prove that our system enforces it. Finally, we show how our type inference algorithm can be used to synthesize a constant-time implementation from one that cannot be verified as secure, effectively repairing insecure programs automatically. We also show how a second novel AARA system that computes lower bounds on resource usage can be used to derive quantitative bounds on the amount of information that a program leaks through its resource use. We implemented each of these systems in Resource Aware ML, and show that it can be applied to verify constant-time behavior in a number of applications including encryption and decryption routines, database queries, and other resource-aware functionality. (Comment: 30, IEEE S&P 201)
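    The property being certified, that running time does not depend on secret data, is easiest to see on the classic example of comparing a secret against an input. This is a generic illustration of the constant-time discipline, not the paper's type system or its Resource Aware ML code.

```python
import hmac

def leaky_equal(a: bytes, b: bytes) -> bool:
    # Early-exit comparison: running time depends on the length of the
    # matching prefix, so an attacker timing calls can recover the secret
    # byte by byte.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # Accumulate the XOR of every byte pair: the loop always runs to the
    # end, so running time is independent of where the inputs differ.
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0

ok = constant_time_equal(b"secret", b"secret")
bad = constant_time_equal(b"secret", b"secreT")
# Python's standard library ships the same discipline as
# hmac.compare_digest.
```

    A type system such as the paper's would reject `leaky_equal` (the branch on secret data makes resource usage depend on a high-security input) and accept `constant_time_equal`.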

    Formal Verification of Synchronous Data-flow Program Transformations Toward Certified Compilers

    Translation validation was introduced in the 1990s by Pnueli et al. as a technique to formally verify the correctness of code generators. Rather than certifying the code generator or exhaustively qualifying it, translation validators attempt to verify that program transformations preserve semantics. In this work, we adopt this approach to formally verify that clock semantics and data dependence are preserved during compilation by the Signal compiler. Translation validation is implemented for every compilation phase, from the initial phase through the final phase where executable code is generated, by proving that the transformation in each phase preserves the semantics. We represent the clock semantics and the data dependence of a program and its transformed counterpart as first-order formulas, called Clock Models and Synchronous Dependence Graphs (SDGs), respectively. We then introduce clock refinement and dependence refinement relations, which express the preservation of clock semantics and dependence as relations on clock models and SDGs, respectively. Our validator requires no instrumentation or modification of the compiler, and no rewriting of the source program.
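    The translation-validation workflow, run the compiler pass, then independently check that the output preserves the input's semantics, can be sketched on a toy expression language. Everything below is invented for illustration: the language, the strength-reduction pass, and the testing-based check (the Signal validator proves refinement between first-order models rather than testing on sampled inputs).

```python
def evaluate(expr, x):
    # Toy language: an expression is the variable "x", an integer
    # literal, or a tuple ("+" | "*", lhs, rhs).
    if expr == "x":
        return x
    if isinstance(expr, int):
        return expr
    op, lhs, rhs = expr
    a, b = evaluate(lhs, x), evaluate(rhs, x)
    return a + b if op == "+" else a * b

def strength_reduce(expr):
    # A toy compiler pass: rewrite e * 2 into e + e.
    if expr == "x" or isinstance(expr, int):
        return expr
    op, lhs, rhs = expr[0], strength_reduce(expr[1]), strength_reduce(expr[2])
    if op == "*" and rhs == 2:
        return ("+", lhs, lhs)
    return (op, lhs, rhs)

def validate(src, dst, inputs=range(-50, 51)):
    # Testing-based validator: the transformed program must agree with
    # the source on every sampled input. A formal validator would prove
    # this equivalence (or a refinement) for all inputs instead.
    return all(evaluate(src, x) == evaluate(dst, x) for x in inputs)

src = ("*", ("+", "x", 3), 2)
dst = strength_reduce(src)    # ("+", ("+", "x", 3), ("+", "x", 3))
ok = validate(src, dst)
```

    The key architectural point carries over from the sketch: the validator is separate from the pass, so the compiler itself needs no instrumentation and bugs in the pass are caught after the fact.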

    Effects of phone versus mail survey methods on the measurement of health-related quality of life and emotional and behavioural problems in adolescents

    Background: Telephone interviews have become established as an alternative to traditional mail surveys for collecting epidemiological data in public health research. However, the use of telephone and mail surveys raises the question of to what extent the results of the two data collection methods deviate from one another. We therefore set out to study possible differences in using telephone and mail survey methods to measure health-related quality of life and emotional and behavioural problems in children and adolescents.
    Methods: A total of 1700 German children aged 8-18 years and their parents were interviewed randomly either by telephone or by mail. Health-related quality of life (HRQoL) and mental health problems (MHP) were assessed using the KINDL-R quality of life instrument and the Strengths and Difficulties Questionnaire (SDQ), in the children's self-report and parent proxy report versions. Mean differences ("d" effect size) and differences in Cronbach alpha were examined across modes of administration. Pearson correlations between children's and parents' scores were calculated within a multi-trait-multi-method (MTMM) analysis and compared across survey modes using the Fisher Z transformation.
    Results: Telephone and mail survey methods resulted in similar completion rates and similar socio-demographic and socio-economic makeups of the samples. Telephone methods resulted in more positive self- and parent proxy reports of children's HRQoL (SMD ≤ 0.27) and MHP (SMD ≤ 0.32) on many scales. For the phone-administered KINDL, lower Cronbach alpha values (self/proxy total: 0.79/0.84) were observed than for the mail survey (self/proxy total: 0.84/0.87). KINDL MTMM results were weaker for the phone surveys: mono-trait-multi-method mean r = 0.31 (mail: r = 0.45); multi-trait-mono-method mean (self/parents) r = 0.29/0.36 (mail: r = 0.34/0.40); multi-trait-multi-method mean r = 0.14 (mail: r = 0.21). Weaker MTMM results were also observed for the phone-administered SDQ: mono-trait-multi-method mean r = 0.32 (mail: r = 0.40); multi-trait-mono-method mean (self/parents) r = 0.24/0.30 (mail: r = 0.20/0.32); multi-trait-multi-method mean r = 0.14 (mail: r = 0.14). The SDQ classification into borderline and abnormal for some scales was affected by the method (OR = 0.36-1.55).
    Conclusions: The observed differences between phone and mail surveys are small but should be regarded as relevant in certain settings. Therefore, while both methods are valid, some changes are necessary. The weaker reliability and MTMM validity associated with phone methods necessitate improved phone adaptations of paper-and-pencil questionnaires. The effects of phone versus mail survey modes differ in part across constructs and measures.
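    The reliability statistic compared across survey modes above, Cronbach's alpha, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch follows; the three-item, five-respondent data set is invented for illustration, not taken from the study.

```python
def cronbach_alpha(items):
    # items: one list of scores per questionnaire item, with respondents
    # aligned across lists.
    k = len(items)

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((v - m) ** 2 for v in xs) / (len(xs) - 1)

    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Invented example: three 5-point items answered by five respondents.
items = [[2, 4, 3, 5, 4],
         [3, 5, 4, 5, 4],
         [2, 5, 3, 4, 5]]
alpha = cronbach_alpha(items)   # about 0.90 for this made-up data
```

    Values around 0.8-0.9, such as those reported for the KINDL totals, are conventionally read as good internal consistency, which is why the drop from 0.84 to 0.79 under phone administration matters.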

    B cell antigen receptor signal strength and peripheral B cell development are regulated by a 9-O-acetyl sialic acid esterase

    We show that the enzymatic acetylation and deacetylation of a cell surface carbohydrate controls B cell development, signaling, and immunological tolerance. Mice with a mutation in sialate:O-acetyl esterase, an enzyme that specifically removes acetyl moieties from the 9-OH position of α2–6-linked sialic acid, exhibit enhanced B cell receptor (BCR) activation, defects in peripheral B cell development, and spontaneously develop antichromatin autoantibodies and glomerular immune complex deposits. The 9-O-acetylation state of sialic acid regulates the function of CD22, a Siglec that functions in vivo as an inhibitor of BCR signaling. These results describe a novel catalytic regulator of B cell signaling and underscore the crucial role of inhibitory signaling in the maintenance of immunological tolerance in the B lineage

    The genomes of two key bumblebee species with primitive eusocial organization

    Background: The shift from solitary to social behavior is one of the major evolutionary transitions. Primitively eusocial bumblebees are uniquely placed to illuminate the evolution of highly eusocial insect societies. Bumblebees are also invaluable natural and agricultural pollinators, and there is widespread concern over recent population declines in some species. High-quality genomic data will inform key aspects of bumblebee biology, including susceptibility to implicated threats to population viability. Results: We report high-quality draft genome sequences of Bombus terrestris and Bombus impatiens, two ecologically dominant bumblebees and widely utilized study species. Comparing these new genomes to those of the highly eusocial honeybee Apis mellifera and other Hymenoptera, we identify deeply conserved similarities, as well as novelties key to the biology of these organisms. Some honeybee genome features thought to underpin advanced eusociality are also present in bumblebees, indicating an earlier evolution in the bee lineage. Xenobiotic detoxification and immune genes are similarly depauperate in bumblebees and honeybees, and multiple categories of genes linked to social organization, including development and behavior, show high conservation. Key differences identified include a bias in bumblebee chemoreception towards gustation rather than olfaction, and striking differences in microRNAs, potentially responsible for gene regulation underlying social and other traits. Conclusions: These two bumblebee genomes provide a foundation for post-genomic research on these key pollinators and insect societies. Overall, gene repertoires suggest that the route to advanced eusociality in bees was mediated by many small changes in many genes and processes, not by notable expansion or depauperation.

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background: Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.